Mar|kov chain
Mar|kov chain «MAHR kf»,
Statistics. a succession of random events, each of which is determined by the event immediately preceding it:

In its simplest form, a Markov chain expresses the probability that one event follows another: the likelihood of a succeeding event depends on the event that immediately preceded it. For example, if the letter Q is known to occur, what is the probability that it is followed by the letter U? (John P. Dowds).
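The Q-followed-by-U example above can be sketched in code: estimating letter-to-letter transition probabilities from observed text. The function name and the tiny sample text below are illustrative assumptions, not part of the dictionary entry; a real estimate would require a large corpus.

```python
from collections import Counter, defaultdict

def transition_probs(text):
    """Estimate letter-to-letter transition probabilities from sample text."""
    letters = [c for c in text.upper() if c.isalpha()]
    counts = defaultdict(Counter)
    # Count each adjacent letter pair (the "preceding event" and its successor).
    for a, b in zip(letters, letters[1:]):
        counts[a][b] += 1
    # Normalize counts into probabilities for each preceding letter.
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

# Toy sample text (hypothetical); here every Q happens to be followed by U.
sample = "QUICK QUIET QUARTZ QUERY"
probs = transition_probs(sample)
print(probs["Q"]["U"])  # 1.0 in this toy sample
```

In this toy sample the estimated probability P(U | Q) is 1.0, matching the intuition in the entry that Q is almost always followed by U in English.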

[< Andrei Markov, 1856-1922, a Russian mathematician]

Useful English Dictionary. 2012.


Look at other dictionaries:

  • Mar|kov|i|an — «mahr KOH vee uhn, K », adjective. of, having to do with, or based on a Markov chain or Markov process: »On the average, only about one sixth of a stock's price change is due to a common market factor. This type of process is said to be… …   Useful English Dictionary
